
    Antibiogram profiling of Helicobacter pylori strains and the efficacy of Brassica capitata against resistant strains isolated from patients suffering from gastroduodenal diseases in Guwahati, Assam

      Objective: Helicobacter pylori resistance toward commonly used antibiotics is increasing, leading to treatment failure; hence, our aim was to determine the antibiogram susceptibility pattern of H. pylori strains isolated in Guwahati, Assam (Northeast India), and to test the efficacy of Brassica capitata against the multi- and dual-drug-resistant strains of North and Northeast India. Methods: The minimum inhibitory concentration of different antibiotics was determined by the agar dilution method. The disc diffusion method was used to check the efficacy of B. capitata against clarithromycin (CLR)-, metronidazole (MTZ)-, and levofloxacin (LEV)-resistant H. pylori strains. Results: All H. pylori strains were sensitive to CLR, tetracycline, amoxicillin, and furazolidone; 72.8% of the strains were sensitive to MTZ and 54.5% to LEV. B. capitata showed good efficacy against the resistant H. pylori strains of North and Northeast India. Conclusion: Most of the H. pylori strains from Northeast India were sensitive to the antibiotics commonly used in the treatment regimen. B. capitata is effective against H. pylori infection, suggesting its potential as an alternative therapy and opening the way for further studies on the identification of novel antimicrobial targets of B. capitata.

    Human-based approaches to pharmacology and cardiology: an interdisciplinary and intersectorial workshop

    Both biomedical research and clinical practice rely on complex datasets for the physiological and genetic characterization of human hearts in health and disease. Given the complexity and variety of approaches and recordings, there is now growing recognition of the need to embed computational methods in cardiovascular medicine and science for analysis, integration and prediction. This paper describes a Workshop on Computational Cardiovascular Science that created an international, interdisciplinary and inter-sectorial forum to define the next steps for a human-based approach to disease supported by computational methodologies. The main ideas highlighted were (i) a shift towards human-based methodologies, spurred by advances in new in silico, in vivo, in vitro, and ex vivo techniques and the increasing acknowledgement of the limitations of animal models. (ii) Computational approaches complement, expand, bridge, and integrate in vitro, in vivo, and ex vivo experimental and clinical data and methods, and as such they are an integral part of human-based methodologies in pharmacology and medicine. (iii) The effective implementation of multi- and interdisciplinary approaches, teams, and training combining and integrating computational methods with experimental and clinical approaches across academia, industry, and healthcare settings is a priority. (iv) The human-based cross-disciplinary approach requires experts in specific methodologies and domains, who also have the capacity to communicate and collaborate across disciplines and cross-sector environments. (v) This new translational domain for human-based cardiology and pharmacology requires new partnerships supported financially and institutionally across sectors. Institutional, organizational, and social barriers must be identified, understood and overcome in each specific setting


    A high efficiency photon veto for the Light Dark Matter eXperiment

    Fixed-target experiments using primary electron beams can be powerful discovery tools for light dark matter in the sub-GeV mass range. The Light Dark Matter eXperiment (LDMX) is designed to measure missing momentum in high-rate electron fixed-target reactions with beam energies of 4 GeV to 16 GeV. A prerequisite for achieving several important sensitivity milestones is the capability to efficiently reject backgrounds associated with few-GeV bremsstrahlung, by twelve orders of magnitude, while maintaining high efficiency for signal. The primary challenge arises from events with photo-nuclear reactions faking the missing-momentum property of a dark matter signal. We present a methodology developed for the LDMX detector concept that is capable of the required rejection. By employing a detailed Geant4-based model of the detector response, we demonstrate that the sampling calorimetry proposed for LDMX can achieve better than 10⁻¹³ rejection of few-GeV photons. This suggests that the luminosity-limited sensitivity of LDMX can be realized at 4 GeV and higher beam energies

    CMS distributed computing workflow experience

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail, along with the operational optimization of resource usage. In particular, the variation of the different workflows during the data-taking period of 2010, their efficiencies and latencies, and their impact on the delivery of physics results are discussed, and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data-taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources, and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.


    Search for the associated production of the Higgs boson with a top-quark pair

    A search for the standard model Higgs boson produced in association with a top-quark pair (tt¯H) is presented, using data samples corresponding to integrated luminosities of up to 5.1 fb⁻¹ and 19.7 fb⁻¹ collected in pp collisions at center-of-mass energies of 7 TeV and 8 TeV, respectively. The search is based on the following signatures of the Higgs boson decay: H → hadrons, H → photons, and H → leptons. The results are characterized by an observed tt¯H signal strength relative to the standard model cross section, μ = σ/σ_SM, under the assumption that the Higgs boson decays as expected in the standard model. The best fit value is μ = 2.8 ± 1.0 for a Higgs boson mass of 125.6 GeV.

    Chaste: Cancer, Heart and Soft Tissue Environment

    Funding: UK Engineering and Physical Sciences Research Council [grant number EP/N509711/1 (J.K.)]. Chaste (Cancer, Heart And Soft Tissue Environment) is an open source simulation package for the numerical solution of mathematical models arising in physiology and biology. To date, Chaste development has been driven primarily by applications that include continuum modelling of cardiac electrophysiology (‘Cardiac Chaste’), discrete cell-based modelling of soft tissues (‘Cell-based Chaste’), and modelling of ventilation in lungs (‘Lung Chaste’). Cardiac Chaste addresses the need for a high-performance, generic, and verified simulation framework for cardiac electrophysiology that is freely available to the scientific community. Cardiac Chaste provides a software package capable of realistic heart simulations that is efficient, rigorously tested, and runs on HPC platforms. Cell-based Chaste addresses the need for efficient and verified implementations of cell-based modelling frameworks, providing a set of extensible tools for simulating biological tissues. Computational modelling, along with live imaging techniques, plays an important role in understanding the processes of tissue growth and repair. A wide range of cell-based modelling frameworks have been developed, each of which has been successfully applied in a range of biological applications. Cell-based Chaste includes implementations of the cellular automaton model, the cellular Potts model, cell-centre models with cell representations as overlapping spheres or Voronoi tessellations, and the vertex model. Lung Chaste addresses the need for a novel, generic and efficient lung modelling software package that is both tested and verified. It aims to couple biophysically detailed models of airway mechanics with organ-scale ventilation models in a package that is freely available to the scientific community.

    Search for standard model production of four top quarks in the lepton + jets channel in pp collisions at √s = 8 TeV

    A search is presented for standard model (SM) production of four top quarks (tt¯tt¯) in pp collisions in the lepton + jets channel. The data correspond to an integrated luminosity of 19.6 fb⁻¹ recorded at a centre-of-mass energy of 8 TeV with the CMS detector at the CERN LHC. The expected cross section for SM tt¯tt¯ production is σ_SM(tt¯tt¯) ≈ 1 fb. A combination of kinematic reconstruction and multivariate techniques is used to distinguish between the small signal and large background. The data are consistent with expectations of the SM, and an upper limit of 32 fb is set at a 95% confidence level on the cross section for producing four top quarks in the SM, where a limit of 32 ± 17 fb is expected.

    Measurement of the elliptic anisotropy of charged particles produced in PbPb collisions at √s_NN = 2.76 TeV

    The anisotropy of the azimuthal distributions of charged particles produced in √s_NN = 2.76 TeV PbPb collisions is studied with the CMS experiment at the LHC. The elliptic anisotropy parameter, v₂, defined as the second coefficient in a Fourier expansion of the particle invariant yields, is extracted using the event-plane method, two- and four-particle cumulants, and Lee–Yang zeros. The anisotropy is presented as a function of transverse momentum (p_T) and pseudorapidity (η) over a broad kinematic range, 0.3 < p_T < 20 GeV/c, |η| < 2.4, and in 12 classes of collision centrality from 0 to 80%. The results are compared to those obtained at lower center-of-mass energies, and various scaling behaviors are examined. When scaled by the geometric eccentricity of the collision zone, the elliptic anisotropy is found to obey a universal scaling with the transverse particle density for different collision systems and center-of-mass energies.
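    The Fourier decomposition underlying v₂ is not spelled out in the abstract; as a reminder, the standard form (with Ψ_R denoting the event-plane angle) is:

    ```latex
    % Azimuthal distribution of emitted particles expanded in harmonics
    % about the event-plane angle \Psi_R; v_2 is the second (elliptic)
    % coefficient, obtained as an average over particles and events.
    \[
      \frac{dN}{d\varphi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\bigl[n(\varphi - \Psi_R)\bigr],
      \qquad
      v_2 = \bigl\langle \cos\!\bigl[2(\varphi - \Psi_R)\bigr] \bigr\rangle .
    \]
    ```

    This is the textbook definition assumed by the event-plane method mentioned above; the cumulant and Lee–Yang-zeros estimators extract the same v₂ without explicit reference to Ψ_R.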